A Theory of Self-organising Neural Networks

Author

  • S P Luttrell
Abstract

The purpose of this paper is to present a generalisation of the probabilistic approach to the static analysis of self-organising neural networks that appeared in [1]. In the simplest case the network has two layers: an input and an output layer. An input vector is used to clamp the pattern of activity of the nodes in the input layer, and the resulting pattern of individual firing events of the nodes in the output layer is described probabilistically. Finally, an attempt is made to reconstruct the pattern of activity in the input layer from knowledge of the location of the firing events in the output layer. This inversion from output to input is achieved by using Bayes' theorem to invert the probabilistic feed-forward mapping from input to output. A network objective function is then introduced in order to optimise the overall network performance. If the average Euclidean error between an input vector and its corresponding reconstruction is used as the objective function, then many standard self-organising networks emerge as special cases [1, 2]. In Section II the network objective function is introduced, in Section III a simpler form is derived which is an upper bound to the true objective function, and in Section IV the derivatives with respect to various parameters of this upper bound are derived. Finally, in Section V various standard neural networks are analysed within this framework.
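To make the abstract concrete, the following is a minimal sketch of the kind of objective it describes: a probabilistic feed-forward map from an input vector to output-node firing probabilities, and an average Euclidean reconstruction error averaged over those firing events. The softmax-over-distances map, the data sizes, and all variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (sizes are illustrative assumptions):
# M output nodes with reference vectors w[k]; N input vectors x of dimension D.
M, D, N = 4, 2, 200
x = rng.normal(size=(N, D))
w = rng.normal(size=(M, D))

def firing_probs(x_vec, w, beta=2.0):
    """Probabilistic feed-forward map P(k | x): a softmax over negative
    squared distances (one plausible choice; the paper leaves the map general)."""
    d2 = np.sum((w - x_vec) ** 2, axis=1)
    p = np.exp(-beta * d2)
    return p / p.sum()

def objective(x, w):
    """Average Euclidean reconstruction error,
    E_x [ sum_k P(k | x) ||x - w_k||^2 ],
    the style of objective whose special cases include standard
    vector quantisers and self-organising networks."""
    total = 0.0
    for x_vec in x:
        p = firing_probs(x_vec, w)
        total += np.sum(p * np.sum((w - x_vec) ** 2, axis=1))
    return total / len(x)

print(objective(x, w))
```

Minimising this quantity over the reference vectors `w` (e.g. by gradient descent, as Section IV's derivatives suggest) is what makes standard networks emerge as special cases.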


Similar articles

On the equivalence between kernel self-organising maps and self-organising mixture density networks

The kernel method has become a useful trick and has been widely applied to various learning models to extend their nonlinear approximation and classification capabilities. Such extensions have also recently occurred to the Self-Organising Map (SOM). In this paper, two recently proposed kernel SOMs are reviewed, together with their link to an energy function. The Self-Organising Mixture Network ...
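The blurb above builds on the classic Kohonen SOM that the kernel variants extend. As background, here is a minimal sketch of the standard SOM update rule, assuming a 1-D map with a Gaussian neighbourhood; map size, learning rate, and neighbourhood width are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Classic 1-D Kohonen SOM (background for the kernel variants discussed above).
K, D = 10, 2                          # number of map nodes, input dimension
w = rng.normal(size=(K, D))           # map node weight vectors
data = rng.normal(size=(500, D))      # toy training data

def som_step(w, x, lr=0.1, sigma=1.5):
    """One SOM update: find the best-matching unit (BMU), then pull every
    node toward x, weighted by a Gaussian neighbourhood on the map index."""
    bmu = np.argmin(np.sum((w - x) ** 2, axis=1))
    idx = np.arange(len(w))
    h = np.exp(-((idx - bmu) ** 2) / (2 * sigma ** 2))
    return w + lr * h[:, None] * (x - w)

for x in data:
    w = som_step(w, x)
```

Kernel SOMs replace the Euclidean distances in the BMU search with distances in a kernel-induced feature space; the neighbourhood-weighted update structure stays the same.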


Automatic Classification using Self-Organising Neural Networks in Astrophysical Experiments

Self-Organising Maps (SOMs) are effective tools in classification problems, and in recent years the even more powerful Dynamic Growing Neural Networks, a variant of SOMs, have been developed. Automatic Classification (also called clustering) is an important and difficult problem in many Astrophysical experiments, for instance, Gamma Ray Burst classification, or gamma-hadron separation. After a ...


Self-generating Neural Networks

This review of recent advances concerning growing and pruning neural networks focuses on three areas. Adaptive resonance theory networks automatically have an architecture whose size adjusts to its task. Networks that optimize resource allocation have been around nearly fifteen years and developments are still being made in growing and pruning strategies, particularly for on-line, real-time a...


Speech Processing Using Artificial Neural Networks

A three-layer perceptron network is used to classify the /i/ sound using isolated words from different speakers. A classification accuracy of 97% has been achieved. A map of phonemes is used to trace trajectories of utterances using the self-organising neural network. A crinkle factor is proposed which allows using the self-organising map to determine the inherent dimensionality of a set of point...


Gamma-filter self-organising neural networks for unsupervised sequence processing

Adding gamma-filters to self-organising neural networks for unsupervised sequence processing is proposed. The proposed gamma-context model is applied to self-organising maps and neural gas networks. The gamma-context model is a generalisation that includes as a particular example the previously published merge-context model. The results show that the gamma-context model outperforms the merge-context model in t...


Using artificial neural networks for solving chemical problems Kohonen self-organising feature maps and Hopfield networks

This second part of a Tutorial on neural networks focuses on the Kohonen self-organising feature map and the Hopfield network. First a theoretical description of each type is given. The practical issues concerning applications of the networks are then discussed. For each network, a description is given of the types of problems which can be tackled by the specific neural network, followed by a p...



Publication date: 2010